Search Results: "Martin F. Krafft"

16 August 2010

Martin F. Krafft: Happy birthday Debian

Dear Debian: I haven't had much of a chance to stay in touch lately, but I don't want to forget to wish you well on this 17th birthday of yours. You have set standards and you continue to do so. You are the operating system of choice, and you excel at it. Keep up the level of quality, and keep up the spirit. I am looking forward to more contact in the future. Thanks to everyone who has dedicated and/or continues to dedicate their time to our project! Love, -m

25 April 2010

Russell Coker: Links April 2010

Sam Harris gave an interesting TED talk about whether there are scientific answers to moral questions [1]. One of his insightful points was that when dealing with facts, certain opinions should be excluded; it would be good if journalists who report on issues of science could understand this. Another insight was that religious people most strongly agree with him regarding the issue of whether there are factual answers to moral questions, but they think that God just handed the answers to their ancestors rather than making it an issue that requires consideration. He cites the issue of gay marriage as being a distraction from moral issues such as genocide and poverty. He asks how we have convinced ourselves that every culture has a point of view worth considering, and how the ignorance of the Taliban on the topic of physics is any less obvious than on the topic of human well-being.

Dan Gilbert gave an insightful TED talk titled "Why Are We Happy?" [2]. One interesting fact he cites is that people who become paraplegic are no less happy in the long term than people who win the lottery. He points out that a shopping mall full of Zen monks is not going to be particularly profitable, and uses this fact to explain the promotion of natural happiness over synthetic happiness in our society.

Dan Barber gave an amusing and informative TED talk, "How I Fell in Love with a Fish" [3]. He speaks about ecological fish farming, how the fish are more tasty, and how the farm is good for the environment. The farm in question is in the south-west of Spain; hopefully there will be more similar farms in other parts of the world soon.

Gary Lauder gave an interesting brief TED talk about road signs [4]. His main point was to advocate a road sign saying "take turns", but there are already signs in the US at freeway on-ramps saying that 1 or 2 cars may enter every time the light turns green, which is a similar concept. The innovative thing he did was to estimate the amount of time and petrol wasted by stop signs, add that up over a year based on the average income, and then estimate that an annuity covering that ongoing expense would cost more than $2,000,000. This makes two stop signs at an intersection an expense of $1,000,000 each. He suggests that rather than installing stop signs it would be cheaper to buy the adjacent land, chop down all trees, and then sell it again.

Alan Siegel gave an insightful TED talk about simplifying legal documents [5]. He gives an example of an IRS document which was analysed with a heat map to show which parts confused the readers; the IRS adopted a new document that his group designed, which made things easier for taxpayers. He advocates legislation to make legal documents easier to understand for customers of financial services.

Tim Berners-Lee gave an interesting TED talk about Open Data; he illustrated it with some fantastic videos showing how mashups have been used with government data [6] and how the OpenStreetMap project developed over time.

Martin F. Krafft gave an interesting Debconf talk about Tool Adoption Behavior in the Debian project [7]. One thing that I found particularly interesting was his description of the Delphi Method that he used to assemble a panel of experts and gather a consensus of opinion. The post-processing on this talk was very good; in some sections Martin's presentation notes are shown on screen with the video of him in the corner. As an aside, I think we really do need camera-phones.

The Big Money has an interesting article comparing the Mafia "bust out" with the practices of US banks [8].

Mark Roth gave an exciting TED talk about using Hydrogen Sulphide to trigger suspended animation [9]. They are now doing human trials of suspending people who have serious injuries to reduce tissue damage during the process of surgery.

Pawan Sinha gave an interesting TED talk about how brains learn to see [10]. He started by talking about curing blindness in people who have been blind since birth, but then ended by showing some research into the correlation between visual processing and Autism: he showed that an Autistic child had significantly different visual patterns when playing Pong than an NT (neurotypical) child.

Adora Svitak gave an insightful TED talk about what adults can learn from kids [11]. She made some particularly interesting points about the education system requiring that adults respect children more and expect them to do better than their parents, which is essential for all progress in society.

The NY Times has an interesting article on animal homosexuality [12]. In terms of research it focuses on lesbian relationships between albatrosses, but a large part of the article is devoted to the politics of scientific research into animal sexuality.

BrowserShots.org shows you what your web site looks like in different web browsers [13].

Cory Doctorow wrote an insightful article titled "Can You Survive a Benevolent Dictatorship" about the Apple DRM [14]. He describes the way the Apple Digital Restrictions Management (DRM) doesn't stop copyright violation but does reduce competition in the computer industry. He is not going to sell his work on the Apple store (for the iPad or iPhone etc.) and suggests that customers should choose a more open platform. It's unfortunate that he didn't suggest a better platform.

19 April 2010

Martin F. Krafft: Orangutans at the Nestlé shareholder meeting

Bravo, Greenpeace Switzerland! At Nestlé's 2010 annual shareholder meeting last week, you descended from the ceiling in the middle of the presentations with flyers and a banner asking the company to take responsibility for its reckless actions in Indonesia. Thousands of square kilometres of forest are cleared every day so that companies like Nestlé can make vast sums of money off consumers. (Image: orangutans asking Nestlé for a break.) Meanwhile, orangutans outside the venue were protesting Nestlé and asking for a break (copying Nestlé's own slogan "Have a Break! Have a…"). The orangutans are being pushed towards extinction by capitalist interests. One of my closest friends was part of the act, and he recounts breaking into the ventilation system before sawing through the ceiling and descending on a rope. The police detained them for more than 24 hours, but the message has been sent. Bravo! Read more (and watch videos of the spectacular descent) on the Greenpeace webpage, the Greenpeace press announcement and their blog (all in German), or on 24heures (in French). Planetsave has decent coverage in English. NP: Emerson, Lake & Palmer: Brain Salad Surgery

17 April 2010

Martin F. Krafft: Planes or volcano

Via internotes.ch: Planes produce way more CO₂ than the volcano. NP: Emerson, Lake & Palmer: Tarkus

11 March 2010

Martin F. Krafft: Splitting puppetd from puppetmaster

My relationship with Puppet is one of love and hate. I am forced to use it simply because there is no better tool around, but I hate it in so many ways that I don't even want to start to enumerate them (hint: most have to do with Ruby, actually). Today I decided to put an end to one thing that has been driving me insane: the fact that puppetd (the client) and puppetmasterd (the server) use the same working directory, /var/lib/puppet. Since I consider the machine on which puppetmasterd is running just another puppet client, and would like to treat it as such, I was running into funky issues related to SSL certificate confusion, obscure errors, and SSL revocation horrors. The following hence assumes that you have installed or are planning to install puppetd on the machine running your puppetmaster, and that you have two fully-qualified domain names for the machine. For instance, I run puppetmaster on vera.madduck.net, and puppetmaster.madduck.net is an alias for the same machine. I'll use these names as examples in the following. The following may be Debian-specific, as I am solely using the puppet and puppetmaster packages for my experimentation and verification. Your mileage may vary, but the concept should be the same.
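For illustration only, such an alias could be a plain CNAME in your zone, or an /etc/hosts entry while testing. The snippet below is a hypothetical sketch (the IP address is a documentation placeholder), not taken from my setup:
    ; hypothetical zone snippet: puppetmaster.madduck.net is an alias for vera
    puppetmaster    IN  CNAME  vera.madduck.net.
    
    # or, for local testing only, an /etc/hosts entry on the client:
    # 192.0.2.10  vera.madduck.net  puppetmaster.madduck.net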
  1. Stop everything:
    /etc/init.d/puppetmaster stop
    /etc/init.d/puppet stop
    
    
    (also verify that you have not instructed cron to restart these services)
  2. Rename the working directory:
    mv /var/lib/puppet /var/lib/puppetmaster
    
    
    and amend /etc/puppet/puppet.conf accordingly:
    [main]
    #  
    vardir=/var/lib/puppetmaster
    ssldir=$vardir/ssl
    #  
    [puppetmasterd]
    certname=puppetmaster.madduck.net
    #  
    
    
    I am doing this in [main], planning to override it for puppetd later, because puppetd is the only program for which it makes sense to be separated from the rest. Since only the puppetmaster needs a special certificate name, that is set specifically in the [puppetmasterd] section. If you use apache2 or nginx in front of your puppetmaster, make sure to amend the SSL file locations in the virtual host definition and restart (!) the service; a sketch of what this might look like follows after this list. You can verify that the configuration has been amended by making sure that there is no output from the following command:
    # puppetmasterd --genconfig | grep -q '/var/lib/puppet/' && echo SOMETHING IS WRONG
    
    
  3. Now restart puppetmaster:
    /etc/init.d/puppetmaster start
    
    
    and verify that it starts. If your puppetmaster previously ran under a different name, it will create a new certificate for itself and sign it. Since the client will get its own working directory (and thus a new SSL certificate), you want to remove all records of the old certificate:
    # puppetca --list --all
    + puppetmaster.madduck.net
    + vera.madduck.net
    # puppetca --clean vera.madduck.net
    
    
  4. Change the configuration file to tell puppetd about its working directory:
    [puppetd]
    server=puppetmaster.madduck.net
    vardir=/var/lib/puppet
    ssldir=$vardir/ssl
    #  
    
    
    This you can verify with the following command, which should not print anything:
    # puppetd --genconfig | grep -q '/var/lib/puppet[^/]' && echo SOMETHING IS WRONG
    
    
  5. Now install puppet, or (re)start it if it s already installed:
    # /etc/init.d/puppet stop
    # puppetd --no-daemonize --onetime --verbose --waitforcert 30 &
    info: Creating a new SSL key for vera.madduck.net
    warning: peer certificate won't be verified in this SSL session
    info: Caching certificate for ca
    info: Creating a new SSL certificate request for vera.madduck.net
    # puppetca --list
    vera.madduck.net
    # puppetca --sign vera.madduck.net
    notice: Signed certificate request for vera.madduck.net
    notice: Removing file Puppet::SSL::CertificateRequest vera.madduck.net at '/var/lib/puppetmaster/ssl/ca/requests/vera.madduck.net.pem'
    # fg
    info: Caching certificate for vera.madduck.net
    info: Caching certificate_revocation_list for ca
    […]
    # puppetca --list --all
    + puppetmaster.madduck.net
    + vera.madduck.net
    # /etc/init.d/puppet start
    
    
    Do yourself a favour and check that it's all working.
  6. Optionally, you can now clean up the client stuff in the server's working directory, for instance like this (it worked for me, but this is the sledgehammer approach):
    # /etc/init.d/puppetmaster stop
    # cd /var/lib/puppetmaster
    # tar -cf /tmp/puppetmaster.workingdir-backup.tar .
    # find ../puppet -type f -printf '%P\n' | xargs rm
    # /etc/init.d/puppetmaster start
    
    
  7. If you stopped cron before (and your puppet recipes have not since restarted it):
    /etc/init.d/cron start
    
    
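As referenced in step 2, here is roughly what the amended SSL locations might look like if nginx sits in front of puppetmasterd. This is only a hedged sketch: the paths assume puppet's usual ssldir layout under the renamed working directory and the example hostnames from above; adapt them to your own front-end configuration:
    # hypothetical excerpt of an nginx virtual host proxying to puppetmasterd;
    # only the certificate paths matter here: they now live under
    # /var/lib/puppetmaster instead of /var/lib/puppet
    server {
        listen 8140;
        ssl on;
        ssl_certificate         /var/lib/puppetmaster/ssl/certs/puppetmaster.madduck.net.pem;
        ssl_certificate_key     /var/lib/puppetmaster/ssl/private_keys/puppetmaster.madduck.net.pem;
        ssl_client_certificate  /var/lib/puppetmaster/ssl/ca/ca_crt.pem;
        ssl_verify_client       optional;
        # ... the proxy/passenger part of the configuration stays unchanged ...
    }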
All done. I wish puppet, or at least Debian's puppet packages, would do this by default. Please let me know if the above conversion works for you. Then I might start working on an automated migration. NP: Genesis: Selling England by the Pound

26 February 2010

Martin F. Krafft: ACTA leak: no surprises about transparency blockers

The most common criticism of the Anti-Counterfeiting Trade Agreement (ACTA) is the lack of transparency. Until the nations disclose the terms of the agreement under negotiation, we are unable to gain an idea of the big picture, let alone voice our opinions and push for changes. Our politicians don't want us to know. We rely on leaked documents for our information. This is backwards in a world where a state should represent its people. This smells foul to me. There are undoubtedly some good reasons for the treaty, and if we can contain worldwide, large-scale trade of counterfeited goods and medicine, then that would be a net benefit to us all. However, we must not allow certain governments to succumb to the pressure of (commercially motivated) lobbyists, to extend that pressure onto other nations using trade as leverage, and to slash our freedom as if it were an inconvenient obstacle in their way. Only if the terms under negotiation become publicly available, and the public is given a voice, can we help our governments enter an agreement that is in the interest of their people, rather than a threat to us. It is hardly surprising that the USA, the most thoroughly capitalist of nations, is the strongest opponent of transparency, because the public might delay or even prevent the treaty. I was not surprised to see South Korea and Germany in the list of supporters of secrecy either. It is interesting to see that the leaders of Singapore, Belgium, Portugal, and Denmark also seem to believe that these negotiations should be withheld from the public. Does anyone know about Switzerland? I tip my hat to New Zealand, Canada, Australia, the Netherlands, Sweden, Finland, Ireland, Hungary, Poland, Estonia, and Austria for their support of transparency.

20 February 2010

Martin F. Krafft: Charge advertisers for the last mile

ISPs fight a raging war over net neutrality because their infrastructure cannot keep up with the increasing demand (or rather supply) of content. Therefore, ISPs want to charge users premiums if they wish to use certain services on the Net. For instance, since videos are usually large in size, one would have to purchase e.g. the "platinum" package to be able to access video hosting sites. It would be a serious loss of freedom if they won, and the Internet would never be the same. Let's turn that idea around: since sites that use advertising make money off every visitor, they are really the ones that should pay the ISPs so that they can improve their infrastructure. The same applies to sites that make money off visitors in other ways. At the moment, users pay to access the network (which is like paying a taxi to get to the market) so that they can visit sites where advertisers make money showing ads to the visitor, which might eventually lead them to pay a manufacturer for a product: the end user pays twice, and the advertisers take in money, leeching off the ISPs' investment in their infrastructure. I think that the advertiser, and not the consumer, should pay the ISP to keep the infrastructure afloat, or even improve it. The manufacturer should then pay the advertisers for displaying the ad, the user consumes if s/he chooses to, and everyone pays only once, for services they want. This will help improve competition among providers, which should always be the goal. If my ISP started to record the volume of HTTP traffic I produce for each target site, charged the targets appropriately (they could start with a couple at first), and I got free connectivity in turn, I'd be quite happy. The ISP wouldn't have to look at the contents at all for that. I don't yet know what to do if the target sites choose not to pay up. ISPs could block them, or throttle or deprioritise their traffic, but either of those might simply lead to an exodus of users, just like premiums would. As usual, this just needs to be done by many ISPs in concert. Are you listening?
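A per-target volume tally of this kind needs no content inspection at all; flow-level accounting is enough. The following is a hypothetical sketch only (the three-column log format is invented for illustration; a real ISP would use NetFlow or similar), not anything proposed in the post itself:
    # sum outbound bytes per destination host from a flow log whose columns are:
    # timestamp  destination-host  bytes   (invented format, for illustration)
    awk '{ bytes[$2] += $3 } END { for (h in bytes) print bytes[h], h }' flows.log \
        | sort -rn | head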

Martin F. Krafft: Making money off ethics

The coffee place around the corner from where Penny and I lived for the past two months, Caffè Mode, offers to make your food using free-range eggs for NZ$1. Free-range eggs are more expensive than normal ones, but the price difference is not one dollar. Therefore, the cafe makes a profit every time a customer makes the right choice. I went in this morning to ask them about it, and the guy taking my coffee order admitted stalemate. When I suggested that the cafe should use free-range eggs exclusively, he agreed. Let's hope that he lets those making that decision know, and that the cafe soon stops making money on ethical choices.

18 February 2010

Martin F. Krafft: Thank you, Catalyst!

Tomorrow, Penny and I head off back home, and two months of living in NZ come to an end. (Did you hear that, pleaserobme.com?) Maybe I'll find the time to write about my impressions of living on this side of the planet and being immersed in Kiwi culture while going about my daily routine and trying to work as much as I could. But there is one thing that should not wait: thank you, Catalyst IT, for giving us workspaces! For the better part of six weeks, you gave us our own room, monitors, keyboards, mice, and connectivity. And more than that: you welcomed us, let us participate in sessions, invited us to your parties, received our parcels, sent out letters, and generally provided us with a great environment to work in. This was certainly well above what we had dreamed of. At times I was forced to stay into the middle of the night (the 12-hour time difference with Europe is not always easy) and spent waking hours in your building alone. Thank you for your trust! Catalyst is a fully New Zealand-owned company that delivers critical open-source business systems to some of NZ's largest organisations, and to organisations worldwide. Catalyst was also a major enabler of LCA2010, and a sponsor of Kiwi Foo Camp, both events that I had the privilege to attend. Let me know when you're in my part of the world. ;) NP: The Mamaku Project: Karekare

17 February 2010

Martin F. Krafft: ACTA documents leaked

Shortly after I wrote my last article about ACTA and the lack of transparency, I was delighted to find out that a report of the recent negotiations in Mexico had been leaked. I find it a bit disconcerting that our politicians, who are theoretically supposed to represent our interests, are writing documents that can leak to the public, when they should have been available to the public from the start. The document and the cover page are available for direct download. Michael Geist has a first analysis:
A brief report from the European Commission authored by Pedro Velasco Martins (an EU negotiator) on the most recent round of ACTA negotiations in Guadalajara, Mexico has leaked, providing new information on the substance of the talks, how countries are addressing the transparency concerns, and plans for future negotiations. (read more )
NP: Dimmer: Degrees of Existence

Martin F. Krafft: Privacy discussion mailing list

Dear lazyweb: I am in search of a mailing list for discussion of matters related to digital identity and privacy in the information age. Unfortunately, my (limited) searching has not unveiled results, mostly because many mailing lists have privacy agreements or some such, polluting the results with pointers to those. If you know of such a list, or you don't but are interested in the topic, don't hesitate to drop me a line. I will then either let you know if my search is successful, or subscribe you once I have created a list to fill the void. NP: Sola Rosa: Solarized

13 February 2010

Martin F. Krafft: ACTA: less knowledge means less resistance

Right now, your government is probably engaged in the discussion of the Anti-Counterfeiting Trade Agreement (ACTA). You are likely not aware of that, because your government has been actively keeping these negotiations and the details surrounding them secret. Your government does not want you to know about a treaty that has far-reaching negative effects on your freedom, as well as your basic human rights. If you did know, you might speak up and make it difficult for the drivers of ACTA to smoothly push their interests past you. A red light should go on in your head right now!

The goals of the trade agreement being negotiated are multifarious, but essentially seem to centre around challenges related to intellectual property and copyright in the digital age, even though it is sometimes claimed that the agreement serves primarily to contain trade in fake Prada bags and Rolex watches. In reality, ACTA is about content producers like movie studios, who try everything to prevent you from copying their work without paying for it, even if you cannot actually purchase the work, because of e.g. technical measures designed to prevent certain people from legally obtaining content, or simply because the media companies are greedy and consider it PR-savvy to delay the release of a given work in certain countries until after people have had a chance to pay a lot of money to the cinemas.

In theory, a creative work goes out of copyright 50 or 75 years after its author died, depending on whether the creativity can be attributed to a person or a corporation, respectively. Therefore, after that time it gets increasingly hard to monetise a work that has not been reinvented in the meantime. Sounds plausible to you and me, but this sort of stuff frightens companies like Disney, who seem powerful enough to simply have the law changed. That is not how things should work. The media producers are failing to control the Internet, and hence they want to turn it into something more like cable TV, which they do know how to control.

ACTA aims to make copyright infringement a criminal offence. ACTA wants to make it possible for a government to cut you off from the Internet because someone thinks you did something bad; they don't actually have to prove it, accusation is enough. Similar efforts have already failed all over the world, e.g. in France and New Zealand. That's a sign, not a reason to try again. ACTA wants to set in stone that you have absolutely no rights when you cross borders. This is largely already the case (border officials can pretty much do with you whatever they want); now it is supposed to be made official, and legally binding. ACTA will create a culture of surveillance and suspicion. ACTA is designed to break the Internet, among other things. But worst of all: I am just speculating, because we are not supposed to know the details.

The best current source of information on ACTA seems to be Canadian law professor Dr. Michael Geist, who has been collecting content and linking to articles consistently since the ACTA negotiations commenced 2–3 years ago. The Electronic Frontier Foundation also has comprehensive resources available. It is even more important today than before to put an end to this secrecy. Don't let your government enter secret agreements that affect you and your life while refusing to talk to you about them beforehand, and probably refusing all responsibility afterwards. Talk to your politicians and ask questions that cannot be answered with stock replies.
If you have specific contact addresses for politicians, please let me know so I can add them. Colin Jackson helped me with this article at Kiwi Foo Camp. He also takes issue with the secrecy around ACTA.

9 February 2010

Martin F. Krafft: Baffling Exchange

I found out yesterday that my university's Microsoft Exchange Server account stopped forwarding my mail on 8 December 2009. As a result, mail accumulated there and remained unseen. Dear examiners, paper authors, supervisors, sponsors, participants, and peers who responded to my calls and cries related to my PhD thesis: I am terribly sorry that you were subjected to this. You usually replied within a few days, but I still sent you reminder after reminder in the weeks to follow. You must have thought that I was a real dork. Please forgive me. I really appreciate your patience!

I filed a ticket with my university's IT service provider, which got closed the next day with "it should now work again". That wasn't going to cut it for me, so I reopened the ticket, asking for an explanation. Next, I received an apology with a bit of speculation. After a bit of research, it seems that the reason was to be found in the inconsistency of being an external staff member (i.e. an e-mail address outside of the Active Directory domain) but still having an account on the server. On 8 December 2009, the server was upgraded with a service pack. This caused Exchange to go a little manic on the housekeeping. After all, why would anyone ever want to forward their e-mail elsewhere and still have an account?

Well, I certainly don't want an account, and yet I have to have one: Microsoft has bought its way into the university and spread its germs all across it, so I need credentials to be able to access shared files, print, browse the library, or search the phone book. However, considering that UL's Outlook Web Access instance does not let users of decent browsers search their mailboxes (a premium feature reserved for users of Internet Exploder), that one cannot manipulate more than a page-full of e-mails at a time, bounce messages, or do many of the other operations that make dealing with large amounts of e-mail possible, and because Exchange mail (if it doesn't get lost in the first place) sucks in so many other ways, I certainly prefer my mail to be handled by a real mail server with a proper mail filter (writeup in progress). Maybe the Exchange service pack was simply designed to get rid of outcasts like me who don't buy into the low-quality Microsoft vendor lock-in? We'll never know, thanks to the proprietary nature of their software (and the fact that the university's service provider apparently does not keep logs of changes). NP: AC/DC: Back in Black

Martin F. Krafft: Sign me up to social networking!

I do not like it when people tell Web 2.0 sites to send me invitation e-mail. I won't enumerate the reasons here. But there is one reason why I don't like you passing on my address to those sites which is the subject of this article: contrary to popular belief, the Web 2.0 is not a money-printing machine. It's a long road until you can actually generate real money with user content. Therefore, some shady sites are probably selling contact details to advertisers to make ends meet while hoping for the big cashflow. I don't have any data to back this up, and I want to change that:
Please tell all your Web 2.0 sites to send me an invitation! Please use an address in the signmeup.madduck.net domain for that, and make sure to include the domain name of the service to which you sign me up before the @ symbol. Also append a hyphen/dash and a random, short string; more on that in just a sec. For instance, if you are one of those people who believe that letting people know where you are (and have been) at any point in time is a good idea, tell Foursquare to send an invitation to:
foursquare.com-ponies@signmeup.madduck.net
The reason for the random, short string ("ponies") is simply so that I can later cross-check that an address receiving spam actually went through a social networking site; I intend to catalogue the invitation messages.
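Just to illustrate that cross-check (this sketch is not part of the original scheme, and the log file name is made up): the local part splits cleanly into the service's domain and the random token, so cataloguing incoming invitations could be as simple as:
    # hypothetical helper: record "site token" for an invitation recipient,
    # e.g. called by the MTA with the envelope recipient as its argument
    rcpt="${1:-foursquare.com-ponies@signmeup.madduck.net}"
    local_part="${rcpt%@signmeup.madduck.net}"   # -> foursquare.com-ponies
    site="${local_part%-*}"                      # -> foursquare.com
    token="${local_part##*-}"                    # -> ponies
    echo "$site $token" >> ~/signmeup-catalogue.log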
Thank you for your time. Keep in mind: the more, the merrier. I'll make sure to report back on the outcome of this little experiment right here, so watch this space. NP: Billy Joel: Cold Spring Harbor

7 February 2010

Martin F. Krafft: Optimise Google

I had previously sought alternative, innovative search engines, but none of the proposed options made me particularly happy. About a year ago, I came across DuckDuckGo, and as of today I've been using DDG as my primary search provider for exactly 10 months. There were several reasons why I switched. I am aware that DuckDuckGo is index-based itself, using the Yahoo API, which in turn means that DuckDuckGo may already be using Bing data. Sounds a bit like out of the frying pan into the fire, unfortunately. I am still investigating better search solutions, and sticking with DuckDuckGo meanwhile. Unfortunately, DuckDuckGo doesn't quite cut the mustard at all times, forcing me to go to Google instead. For this reason I am glad to find that the CustomizeGoogle Firefox extension has not been discontinued, but simply renamed to OptimizeGoogle. This extension allows me to anonymise my identity towards Google, remove click tracking (which Google doesn't want you to know about and hence hides with JavaScript), hide ads, and customise a slew of other aspects of the giant's search engine. It alleviates some of the aforementioned concerns, but not all. Maybe it's time to rethink the way I use the web and lower my search needs. If you are using Firefox, try it out! If you're still using Internet Exploder, you should stop, and upgrade to Firefox instead. Users of other browsers might find similar functionality for their application, or might want to switch as well. NP: Tunng: Comments from the Inner Chorus

6 February 2010

Martin F. Krafft: Of waterfalls and communication culture

I got involved with open-source software before I learnt about software development in a university course. Naturally, when my profs tried to teach the waterfall model to me, I couldn't take them too seriously back then. After all, requirements specification → design → implementation → verification → maintenance is not really in line with the principle of "release early, release often". Furthermore, since water cannot flow uphill, the waterfall model fails to represent development cycles as they naturally appear, even in behemoth, ancient software nightmares. And yet, when embarking on a new project, I do tend to find myself first thinking about the big picture, instead of churning out code. I am certainly not the best coder out there, and it might well be that I could benefit from learning to break down problems to get an earlier start on the implementation of components. However, I maintain that avoiding the waterfalls and engaging directly in extreme programming, agile software development, or pair-based approaches right away is not the answer. Rather, the best approach should probably involve a certain level of conceptualisation before code is produced. I am a big fan of test-driven development, and I like the scrum method for the very reason that it involves talking and challenging ideas (although I wouldn't follow the method by the book). I like to think about trickles in the mountains where water droplets joyfully jump around. * * * When Glyn Moody spoke in his LCA2010 keynote about challenges we (as in society) face, and how open source seems to have many answers, he dropped the following gem, which spoke right to my heart:
Twitter is the release early, release often principle applied to thinking.
By this simile, journal articles are produced according to the waterfall model. This may well be why they are usually outdated at the time of publication. Microblogging (like Twitter), on the other hand, is primarily used to publish stuff before it's ready, stuff which would never be published otherwise. With journals on one end and microblogging on the other, I think the epiphany is found in between, just as with software development: web logs, web applications that allow for easy publishing by anyone (which is a different problem, not to be discussed here). Since articles on those platforms usually have at least a title and a body, they require just a little bit more thought than 140 characters of contracted brain farts, spilled into the world faster than it takes one to stand up, stretch, and sit down again. * * * Microblogging seems to be in line with where we're heading: more information, more self-promotion, more access to more people, and all that with lower barriers to entry. It's hard to argue against a trend, but I think we've taken a wrong turn somewhere. The one specific instance of content is no longer relevant, and there is no more time in the day to read elaborate treatments of subject matters. Instead, what seems to prevail is a constant flow. This flow threatens to replace actual thinking and discourse, both of which require reflection and time, a scarce resource used up by ever new, fast-flowing media. It seems to me that those who are immersed in this flow are unable to get out, as if sucked in by a maelstrom. I've seen people enter serious withdrawal within hours of not knowing what's going on in the world. One could miss out on something. If you're following people on one of those microblogging platforms, I challenge you to spend the weekend offline, and when the urge hits, ask yourself what you are actually missing. I mean what you are really missing, and by that I mean anything other than the cozy buzz and hum of entertainment washing over you, preventing you from having to think about what you could be (actively) doing instead. I hope it's not a lot. Otherwise, I fear, future generations will be stuck with this communication culture, just like water droplets can't ever play in the mountain trickle again. NP: Sola Rosa: Get It Together

31 January 2010

Axel Beckert: abe@debian.org

On Wednesday I got DAM approval, and since Saturday late evening I'm officially a Debian Developer. Yay! :-) My thanks go to […]. As Bernd cited in his AM report, my earliest activity within the Debian community I can remember was organising the Debian booth at LinuxDay.lu 2003, where I installed Debian 3.0 Woody on my Hamilton HamStation "hy" (a Sun SparcStation 4 clone). I wrote my first bug report in November 2004 (#283365), probably during the Sarge BSP in Frankfurt. And my first Debian package was wikipedia2text, which I started to package in August 2005 (ITP #325417). My only earlier documented interest in the Debian community is subscribing to the lists debian-apache@l.d.o and debian-emacsen@l.d.o in June 2002. I do, though, remember that I started playing around with Debian 2.0 Hamm, skipped 2.1 (for whatever reason; I can't remember), used 2.2 quite regularly, and started to dive in with Woody, which also ran on my first ThinkPad, "bijou". I installed it over WLAN with just a boot floppy at the Chemnitzer Linux-Tage. :-) Anyway, this has led to what it had to lead to: a new Debian Developer. :-) The first package I uploaded with my newly granted rights was a new conkeror snapshot. This version should work out of the box on Ubuntu again, so that conkeror in Ubuntu should not lag that far behind Debian Sid anymore.

In other news: since Wednesday I own a Nokia N900 and use it as my primary mobile phone now. Although it's not as free as the OpenMoko (see two other recent posts by Lucas Nussbaum and by Tollef Fog Heen on Planet Debian), it's definitely what I hoped the OpenMoko would one day become. And even if I can't run Debian natively on the N900 (yet), it at least has a Debian chroot on it. :-)

I'm going to FOSDEM, the Free and Open Source Software Developers' European Meeting. A few weeks ago, I took over the organisation of this year's Debian booth at FOSDEM from Wouter Verhelst, who's busy enough with FOSDEM organisation itself. Last Monday the organiser of the BSD DevRoom at FOSDEM asked on #mirbsd for talk suggestions, and they somehow talked me into giving a talk about Debian GNU/kFreeBSD. The slides should show up during the next days on my Debian GNU/kFreeBSD talks page. I hope I'll survive that talk, despite giving more or less a talk saying "Jehova!". ;-) What a week.

28 January 2010

Martin F. Krafft: Adopted passwdqc

Tollef forced me to take over libpam-passwdqc after I had reported bug #517967. passwdqc is a toolset that can be used to enforce password strength policies at exactly the right place: there's a PAM module, and with the next version you can also use a library and command-line tools; read on below. The toolset gives administrators flexibility in defining the minimum password length based on the number of character classes a user tries to use. It also includes libpam-cracklib functionality and prevents the use of trivial passwords. I appreciate this functionality, so I had little choice but to make the best out of it. Debhelper 7 is really nice. Thanks, Joey.
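For those who have not used it, enabling the module is a one-line affair in PAM. The excerpt below is only an illustration, using passwdqc's documented defaults for the per-character-class minimum lengths rather than anything specific to this post:
    # hypothetical excerpt from /etc/pam.d/common-password:
    # the minimum length depends on how many character classes the password uses;
    # "disabled,24,11,8,7" and max=40 are the upstream defaults, shown as an example
    password  requisite  pam_passwdqc.so min=disabled,24,11,8,7 max=40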

Martin F. Krafft: DistroSummit 2010

Linux.conf.au 2010 has come to an end and I am looking back at an intense week of conferencing. A big shout out to the organisers for their excellent work. I think LCA (as well as DebConf) just keeps getting better every year. This does not at all discredit previous organisers, because they were the best at their time and then passed on the wisdom and experience to help make it even better in the following year. The week started off with the DistroSummit, which Fabio and I organised. Slides are forthcoming, as I failed to get them off the speakers right after their talks; it's interesting how stress levels and adrenaline can cause one to forget the most obvious things. This is where experience comes in. I'll be there again next year, I hope, to do things better. The theme of the day was cross-distro collaboration, and we started the day a little bit on the Debian side with Lucas Nussbaum telling us about quality assurance in Debian, alongside an overview of available resources. We hoped to give people from other distros pointers, and to solicit feedback that would enable us to tie quality assurance closer together. Next up was Bdale Garbee, who talked about the status of the Linux Standard Base. While I am really interested in such standardisation efforts, I realised during his talk that I had considerable difficulties paying attention, because as organiser of the conference I had all sorts of other things occupying my thoughts. I proceeded to tell the audience about vcs-pkg.org; the room was mostly filled throughout the day with an estimated 40–50 folks, and I'd say about half of them stayed throughout, while the other half came in and left the room between talks. I could not get the projector to work with my laptop after the upgrade to Kernel Mode Setting, and thus used the whiteboard to give a brief introduction, talk about the current state of affairs, summarise the trends in discussions around patch management and collaboration, give an outlook of what's up next, and solicit some discussion. Sadly, just like during Bdale's talk, I found myself worrying over the organisation of the day rather than actually taking in most of the discussion. Fortunately, others have written about the most important points, so I defer to them. Michael Homer then told us about GoboLinux's Aliens system, which is a way to integrate domain-specific packages with distro-specific package maintenance, e.g. how to get APT to handle CPAN directly, or how to let YUM manage Python packages. The ensuing discussion was interesting, and we carried it over to the next slot, because Scott, the next speaker, was stuck in traffic. To summarise briefly: scripting languages have a lot of NIH-style solutions because it works for them, but these are a nightmare for distro packagers. One symptom of the status quo is that complex software packages like Zimbra are forced to distribute all required components in their installation packages, which makes distro packaging, quality assurance, and security support even harder. I don't think we found a solution, other than the need for further standardisation (like the LSB), but the road seems to be a long and winding one. Laszlo Peter introduced the audience to SourceJuicer, a new build system used by OpenSolaris. The idea is that contributors submit packages via a web interface, kicking off a workflow incorporating discussion and vetting, and only after changes have been signed off are packages forwarded to auto-builders, eventually ending up in the package repository.
This is very similar to upload ideas I had a while ago, which I have (finally) started to implement. Unfortunately, SourceJuicer seems very specific to OpenSolaris, as well as non-modular, so I probably won't be able to reuse e.g. the web interface on top of a Debian-specific package builder. After the break, Dustin Kirkland stepped up to tell us about his user experience of Launchpad. Unfortunately, I found his talk a bit too enthusiastic. Launchpad undoubtedly has some very cool features and ideas, but it's just one of the available solutions. The discussion of Launchpad also dominated the next talk, in which Lucas Nussbaum told us about the Debian-Ubuntu relationship. While his presentation showed that the relationship is improving (Matt Zimmerman made the point that there are really many relationships, rather than a single one), I was a bit disturbed by the comments of Launchpad developers in the room, ranging from "Debian is declining anyway" to "Just use Launchpad if you want to collaborate with others and not go down". There was a slight aura of arrogance in their comments which tainted my experience of the otherwise constructive discussions of the day. Overall I had a great time. Debian and Ubuntu made up the vast majority of attendees, with only a handful of representatives from other distros present. I wonder why that would be. One reason might be that around 70% of LCA attendees declared themselves Debian or Ubuntu users, and so there weren't many other distros around. Another might be that I still haven't spread the word enough. Let's hope to do better next year! Thanks to all the speakers. We may have organised the day, but you made it happen and made it interesting! Slides and recordings of the talks will be linked from the archived website when they become available (yes, the archive page does not exist yet either).

25 January 2010

Martin F. Krafft: Vodafone intransparency

Coming to New Zealand for an extended period of time, I figured it would make sense to purchase a prepay mobile plan to make it easier to mix with locals. Not knowing better, I went with Vodafone, which I whole-heartedly regret: their website is a massive pain in the ass, their price plans are completely over the top, and their customer service representatives are incompetent and unfriendly. My latest experience eclipsed all previous encounters, and makes me want to tell you about it: with all the obscure add-ons Vodafone threw at me when I bought this SIM card, two weeks ago I couldn't figure out how my balance had decreased from $30 to $0 when I rarely ever made calls. I wrote an e-mail to their customer service hotline, and it took them a week to get back to me, with the following text:
Due to being a Prepay Customer, unfortunately usage details are not available as per terms and conditions. I have although checked your usage and can confirm that all charges are correct.
Obviously, I wasn't going to accept this claim of omniscience, so when, last weekend, $20 disappeared over the course of a day, that was the catalyst for me to reopen the ticket and reply along the lines of:
Only I know when I used my phone and thus only I can determine whether the charges are correct. Please show the full records to me, or else…
This seemed to convince the representative, and 8 messages and 11 days after my initial request, I was told I could request the records at $5 per 30 records. Yes, you read that right: they wanted to charge me to view the records. I thus replied:
I am NOT willing to pay for that. If you are unable to comply with my desire for transparency, then I shall terminate the contract and make sure to inform the media as well as the consumer institute of this conduct. As stated previously, I shall also consult with a lawyer. Charging consumers to view data that is obviously available is a strong indication that you do not want me to see it. I can't imagine why this would be the case other than the data being inconsistent with reality.
That worked, and I finally got an Excel sheet with my usage data, which allowed me to track down the depletion of my account: to lure customers in, they promise free calls to other Vodafone numbers for the first four weekends. There are three problems with that though:
  1. Having purchased my card on Saturday afternoon, I was annoyed to find out that the remaining 34 hours of that weekend would be counted as a whole weekend.
  2. They don't provide a way to find out whether a given number actually belongs to Vodafone or not. The 021 prefix is not enough of an indication.
  3. They don't actually tell you anywhere but on the aforementioned horrific website that the add-on has expired.
So thanks, Vodafone. You've lost a customer, who should have gone with 2degrees in the first place, who have much lower rates, even though their data coverage doesn't seem as good. I don't need data anyway. I'll still insist on Vodafone providing the data in a Free format. You can find more information about NZ mobile phone providers on the LCA2010 wiki page. NP: Age Pryor: Shank's Pony
